
Learning Naive Bayes Tree for Conditional Probability Estimation



Abstract

Naïve Bayes Tree uses a decision tree as the general structure and deploys naïve Bayesian classifiers at the leaves. The intuition is that naïve Bayesian classifiers work better than decision trees when the sample data set is small. Therefore, after several attribute splits when constructing a decision tree, it is better to use naïve Bayesian classifiers at the leaves than to continue splitting the attributes. In this paper, we propose a learning algorithm to improve the conditional probability estimation within the framework of Naïve Bayes Tree. The motivation for this work is that, for cost-sensitive learning where costs are associated with conditional probabilities, the score function is optimized when the estimates of conditional probabilities are accurate. The additional benefit is that both the classification accuracy and the Area Under the Curve (AUC) can be improved. On a large suite of benchmark sample sets, our experiments show that the CLL tree significantly outperforms state-of-the-art learning algorithms such as Naïve Bayes Tree and naïve Bayes in yielding accurate conditional probability estimates and in improving classification accuracy and AUC.
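To make the model structure concrete, the minimal Python sketch below (an illustration, not the authors' algorithm) grows a decision tree over discrete attributes, stops splitting once a node holds few examples, fits a Laplace-smoothed naïve Bayes model at each leaf, and evaluates the resulting conditional probability estimates with the conditional log-likelihood. The function names (fit_naive_bayes, build_nbtree, predict_proba, cll), the placeholder split rule, and the min_leaf threshold are assumptions made for illustration, whereas the paper's CLL tree is learned specifically to improve these conditional probability estimates.

import math
from collections import Counter, defaultdict

def fit_naive_bayes(rows, labels, laplace=1.0):
    # Per-leaf naive Bayes with Laplace smoothing over discrete attribute values.
    counts = Counter(labels)
    classes = sorted(counts)
    n, n_attrs = len(rows), len(rows[0])
    prior = {c: (counts[c] + laplace) / (n + laplace * len(classes)) for c in classes}
    cond = [defaultdict(Counter) for _ in range(n_attrs)]   # cond[a][c][v] = count
    values = [set() for _ in range(n_attrs)]
    for row, y in zip(rows, labels):
        for a, v in enumerate(row):
            cond[a][y][v] += 1
            values[a].add(v)

    def predict(row):
        logp = {}
        for c in classes:
            s = math.log(prior[c])
            for a, v in enumerate(row):
                s += math.log((cond[a][c][v] + laplace) /
                              (counts[c] + laplace * len(values[a])))
            logp[c] = s
        z = max(logp.values())
        exp = {c: math.exp(s - z) for c, s in logp.items()}
        total = sum(exp.values())
        return {c: e / total for c, e in exp.items()}   # P(class | attributes)

    return predict

def build_nbtree(rows, labels, attrs, min_leaf=30):
    # Stop splitting once the node is small; a real NBTree scores candidate splits,
    # here the first remaining attribute is used as a placeholder choice.
    if len(rows) <= min_leaf or not attrs:
        return ('leaf', fit_naive_bayes(rows, labels))
    a, rest = attrs[0], attrs[1:]
    children = {}
    for v in set(r[a] for r in rows):
        idx = [i for i, r in enumerate(rows) if r[a] == v]
        children[v] = build_nbtree([rows[i] for i in idx],
                                   [labels[i] for i in idx], rest, min_leaf)
    return ('split', a, children, fit_naive_bayes(rows, labels))

def predict_proba(tree, row):
    # Route the example down the tree; unseen attribute values fall back to the
    # naive Bayes model fitted at the internal node.
    if tree[0] == 'leaf':
        return tree[1](row)
    _, a, children, fallback = tree
    return predict_proba(children[row[a]], row) if row[a] in children else fallback(row)

def cll(tree, rows, labels):
    # Conditional log-likelihood: sum of log P(true class | attributes).
    return sum(math.log(max(predict_proba(tree, r).get(y, 0.0), 1e-12))
               for r, y in zip(rows, labels))

Given discrete-valued rows and class labels, build_nbtree(rows, labels, list(range(len(rows[0])))) returns a tree whose predict_proba outputs are conditional probability estimates of the kind the paper aims to improve, and cll computes the score that the conditional-log-likelihood criterion evaluates.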

Record details

  • Authors

    Liang, H.; Yan, Y.;

  • Affiliation
  • Year 2006
  • Pages
  • Format PDF
  • Language eng
  • CLC classification
